Then we'll go on to the next topic, which is neural networks, which is probably what
you've been waiting for.
Neural networks, which we'll basically understand as brain-inspired
computation, start from nerve cells, which you probably remember
from high school.
Nerve cells have a body that does some kind of computation.
And importantly, these nerve cells are interconnected.
And they have a kind of input-output behavior.
Correct me if I'm wrong, I always get this wrong: the dendrites are essentially the inputs.
And then you have one, typically very long, output channel that connects to other
cells, to the dendrites of the other cells.
That's the axon.
And what the cell basically does is collect input, do some computation on it, and then decide,
via that computation, whether to send out signals that are then picked up by other nerve cells.
That's how the brain works.
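As a rough sketch of this collect-compute-fire behavior (my illustration, not from the lecture; the weights and threshold are made-up numbers), a single such unit might look like this in Python:

```python
# Sketch of the collect-compute-fire behavior described above.
# Weights and threshold are illustrative values, not biological data.

def fires(inputs, weights, threshold):
    """Fire (return 1) if the weighted input sum reaches the threshold."""
    activation = sum(x * w for x, w in zip(inputs, weights))
    return 1 if activation >= threshold else 0

# Three incoming signals with different connection strengths:
print(fires([1, 0, 1], weights=[0.5, 0.8, 0.4], threshold=0.7))  # prints 1 (0.9 >= 0.7)
```

The output of such a unit can in turn serve as an input to other units, which is the interconnection just described.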
The nice thing about the brain is that we have loads of those: about 10^11 neurons
in a brain.
And they're extremely highly connected: each cell has, on average, about 1,000
connections to other cells.
So we have a network of highly connected little computational units.
And we have cycles in the millisecond range.
And the signals are essentially changes in electrical potential.
And with that, we compute.
So far, this is all something you should already know.
Now, notably, the only agents that are truly intelligent use computational
devices, a.k.a. brains, that are not like computers.
Computers use hardware that is based on silicon.
We have logical gates in them.
Each gate is connected to relatively few other logical gates, typically one or two, with
very small switching times.
And we do everything to make sure that they are deterministic.
A chip is typically thrown away if it becomes nondeterministic about whether there's
a one or a zero somewhere.
You test for that, and you discard the chip if that happens.
Brains work completely differently.
We have noisy computations, we have very high connectivity, and we have very
slow computation, but lots of computational units.
Now it's pretty easy to have the idea, well, if you want intelligence, do it like a brain.
And actually, the hope is that once you start building artificial brains, intelligence comes
along naturally.
It's just like with babies: you actually have to do quite a lot to keep them from
becoming intelligent, at least to a degree.
And that's kind of the hope of the AI subfield of neural networks, which runs under a couple
of names.
Connectionism is the "-ism" way of saying it.
If you want to say it in a more computer science-y way, you say parallel distributed
processing, or neural computation, which is what it's called in cognitive science.
The artificial approach to building a biological neuron: the McCulloch-Pitts unit, and how to implement logical functions as units.
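As a hedged sketch of that idea (the unit itself follows the classical McCulloch-Pitts model of binary inputs, fixed weights, and a threshold; the function names and the specific weight/threshold choices here are mine), AND, OR, and NOT each come out as a single unit:

```python
# A McCulloch-Pitts unit: binary inputs, fixed weights, a threshold.
# The weight/threshold settings below are one standard way to realize
# AND, OR, and NOT as single units.

def mcp_unit(inputs, weights, threshold):
    """Fire (return 1) iff the weighted input sum reaches the threshold."""
    return 1 if sum(x * w for x, w in zip(inputs, weights)) >= threshold else 0

def AND(a, b):
    return mcp_unit([a, b], weights=[1, 1], threshold=2)

def OR(a, b):
    return mcp_unit([a, b], weights=[1, 1], threshold=1)

def NOT(a):
    return mcp_unit([a], weights=[-1], threshold=0)

# Truth tables:
for a in (0, 1):
    for b in (0, 1):
        print(a, b, "AND:", AND(a, b), "OR:", OR(a, b))
print("NOT 0:", NOT(0), "NOT 1:", NOT(1))
```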